
    Commentary: Both/And: A Response to De(fence)/Defense

    In this paper we introduce non-dualism, beginning with answers to the questions posed by the editors of this journal. We address the theme of de(fence) and propose a paradigmatic shift. For many years, art teachers have advocated tirelessly in defense of the field, fighting for funding and legitimacy in an educational landscape that prioritizes other subjects. While the impulse to fight is understandable, art reveals another way. It aids us in the task of living in the liminal, and it gives us the chance to suspend our judgments and forgo meaning in favor of experience. Art can help us transition from the dual mind to a non-dualistic awareness. When we experience art as it is, we stop seeing differences and start to see connections.

    How to Train a CAT: Learning Canonical Appearance Transformations for Direct Visual Localization Under Illumination Change

    Direct visual localization has recently enjoyed a resurgence in popularity with the increasing availability of cheap mobile computing power. The competitive accuracy and robustness of these algorithms compared to state-of-the-art feature-based methods, as well as their natural ability to yield dense maps, make them an appealing choice for a variety of mobile robotics applications. However, direct methods remain brittle in the face of appearance change due to their underlying assumption of photometric consistency, which is commonly violated in practice. In this paper, we propose to mitigate this problem by training deep convolutional encoder-decoder models to transform images of a scene such that they correspond to a previously-seen canonical appearance. We validate our method in multiple environments and illumination conditions using high-fidelity synthetic RGB-D datasets, and integrate the trained models into a direct visual localization pipeline, yielding improvements in visual odometry (VO) accuracy through time-varying illumination conditions, as well as improved metric relocalization performance under illumination change, where conventional methods normally fail. We further provide a preliminary investigation of transfer learning from synthetic to real environments in a localization context. An open-source implementation of our method using PyTorch is available at https://github.com/utiasSTARS/cat-net.
    Comment: In IEEE Robotics and Automation Letters (RA-L) and presented at the IEEE International Conference on Robotics and Automation (ICRA'18), Brisbane, Australia, May 21-25, 2018
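    As a rough, self-contained sketch of the idea (a toy model, not the authors' CAT-Net; see the linked repository for the real implementation), a convolutional encoder-decoder can be trained to map images of a scene taken under varying illumination to a fixed canonical appearance, here with a simple pixel-wise L1 loss and random placeholder tensors standing in for real image pairs:

        # Toy encoder-decoder (illustrative only; not the authors' CAT-Net).
        import torch
        import torch.nn as nn

        class CanonicalAppearanceNet(nn.Module):
            """Maps an input image to an estimate of its canonical appearance."""
            def __init__(self):
                super().__init__()
                self.encoder = nn.Sequential(
                    nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),
                    nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),
                )
                self.decoder = nn.Sequential(
                    nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
                    nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
                )

            def forward(self, x):
                return self.decoder(self.encoder(x))

        model = CanonicalAppearanceNet()
        optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
        loss_fn = nn.L1Loss()  # pixel-wise loss against the canonical image

        # Placeholder batch: images under varied lighting paired with their
        # canonical-appearance targets (random tensors stand in for real data).
        varied = torch.rand(8, 3, 64, 64)
        canonical = torch.rand(8, 3, 64, 64)

        optimizer.zero_grad()
        loss = loss_fn(model(varied), canonical)
        loss.backward()
        optimizer.step()

    The transformed output can then be fed to an unmodified direct localization pipeline in place of the raw image, which is what makes the approach attractive: the downstream photometric machinery stays untouched.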

    What Ways Can We Use Big Data to Offer More Personalized and Tailored HR Services to our Employees?

    Big data analytics (analytic techniques operating on big data) continues to disrupt the way decisions are made. Instead of relying on intuition, decisions are based on statistical analysis, emerging technologies, and massive amounts of current and historical data. Predictive analytics, which features in much of the research below, is a type of big data analytics that predicts an outcome by modelling the relationships among various factors. These predictions can be made using a variety of structured and unstructured data (e.g., social media posts, surveys).
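    As a hedged sketch of the kind of predictive analytics described above (the dataset, column names, and outcome are invented for illustration, not drawn from any real HR system), one might fit a logistic regression that estimates attrition risk from structured employee features:

        # Hypothetical predictive-analytics sketch: logistic regression
        # estimating attrition risk from structured HR features.
        # All column names and values are invented for illustration.
        import pandas as pd
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import train_test_split

        df = pd.DataFrame({
            "tenure_years":     [1, 3, 7, 2, 10, 4, 6, 1],
            "salary_band":      [1, 2, 3, 1, 3, 2, 3, 1],
            "engagement_score": [0.3, 0.7, 0.9, 0.2, 0.8, 0.6, 0.9, 0.4],
            "left_company":     [1, 0, 0, 1, 0, 0, 0, 1],  # outcome to predict
        })

        X = df.drop(columns="left_company")
        y = df["left_company"]
        X_train, X_test, y_train, y_test = train_test_split(
            X, y, test_size=0.25, random_state=0)

        model = LogisticRegression().fit(X_train, y_train)
        print(model.predict_proba(X_test)[:, 1])  # attrition risk per employee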

    Description, measurement and analysis of glacitectonically deformed sequences


    The application of regional-scale geochemical data in defining the extent of aeolian sediments: the Late Pleistocene loess and coversand deposits of East Anglia, UK

    The ‘European Coversand Sheet’ is a discontinuous ‘sheet’ of aeolian (windblown) loess and coversand that extends through eastern and southern England, across the English Channel into northern France, Belgium and the Netherlands (Kasse, 1997; Antoine et al., 2003). Whilst some of the earlier aeolian sediments date from the Middle Pleistocene, most correspond to the Late Pleistocene Weichselian / Devensian and earliest Holocene stages. East Anglia contains considerable accumulations of aeolian sediment. Although several valuable studies have attempted to determine the spatial extent of aeolian material (e.g. Catt, 1977, 1985), defining its margins has proved difficult, largely because aeolian material is highly susceptible to reworking and removal by various natural and anthropogenic agents. In this study, we use regional-scale geochemical data from soils to reconstruct the extent of aeolian sediments in East Anglia. A specific geochemical signature, defined by elevated concentrations of hafnium (Hf) and zirconium (Zr), is strongly characteristic of soils developed on aeolian deposits in the United States, China, Europe and New Zealand (Taylor et al., 1983). The data suggest that the approach is sufficiently sensitive to identify a residual aeolian component within soils even where deposits are thin and unmappable by conventional methods, or where the material has been largely eroded.
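    As an illustrative sketch only (the thresholds and concentration values below are invented, not the study's calibration against regional baseline data), the Hf/Zr screening idea could be expressed as flagging samples whose concentrations are elevated relative to a regional background:

        # Illustrative screening only: thresholds and values are invented,
        # not the study's calibration against regional baseline data.
        import pandas as pd

        samples = pd.DataFrame({
            "site":   ["A", "B", "C", "D"],
            "Hf_ppm": [4.2, 9.8, 11.5, 3.1],
            "Zr_ppm": [180, 410, 505, 150],
        })

        # Flag samples whose Hf and Zr both exceed 1.5x the regional median,
        # as a crude proxy for a residual aeolian component in the soil.
        elevated = ((samples["Hf_ppm"] > 1.5 * samples["Hf_ppm"].median())
                    & (samples["Zr_ppm"] > 1.5 * samples["Zr_ppm"].median()))
        samples["likely_aeolian"] = elevated
        print(samples)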

    Computation in generalised probabilistic theories

    The existence of an efficient quantum algorithm for factoring suggests that quantum computation is intrinsically more powerful than classical computation. At present, the best known upper bound on the power of quantum computation is that BQP is contained in AWPP. This work investigates limits on computational power that are imposed by physical principles. To this end, we define a circuit-based model of computation in a class of operationally-defined theories more general than quantum theory, and ask: what is the minimal set of physical assumptions under which the above inclusion still holds? We show that given only an assumption of tomographic locality (roughly, that multipartite states can be characterised by local measurements), efficient computations are contained in AWPP. This inclusion still holds even without assuming a basic notion of causality (where the notion is, roughly, that probabilities for outcomes cannot depend on future measurement choices). Following Aaronson, we extend the computational model by allowing post-selection on measurement outcomes. Aaronson showed that the corresponding quantum complexity class is equal to PP. Given only the assumption of tomographic locality, the inclusion in PP still holds for post-selected computation in general theories. Thus, in a world with post-selection, quantum theory is optimal for computation in the space of all general theories. We then consider whether relativised complexity results can be obtained for general theories. It is not clear how to define a sensible notion of an oracle in the general framework that reduces to the standard notion in the quantum case. Nevertheless, it is possible to define computation relative to a 'classical oracle'. We show that there exists a classical oracle relative to which efficient computation in any theory satisfying the causality assumption and tomographic locality does not include NP.
    Comment: 14+9 pages. Comments welcome
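    The complexity-class picture established above can be summarised in one chain, writing BGP and PostBGP for efficient computation, without and with post-selection, in a tomographically local general theory (these names are used here for illustration and need not match the authors' notation; the containments AWPP ⊆ PP ⊆ PSPACE are standard classical facts):

        % BGP / PostBGP name efficient (post-selected) computation in a
        % tomographically local general theory; the names are illustrative.
        % AWPP \subseteq PP \subseteq PSPACE is a standard classical fact.
        \[
          \mathsf{BGP} \subseteq \mathsf{AWPP} \subseteq \mathsf{PP}
          \subseteq \mathsf{PSPACE},
          \qquad
          \mathsf{PostBGP} \subseteq \mathsf{PP} = \mathsf{PostBQP}.
        \]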